Add docker-style progress bars and resumable downloads to llama-server -dr #42
This PR implements docker-style progress bars and resumable download functionality for the `llama-server -dr` flag, which downloads models from Docker Hub using the OCI protocol.

Problem
Previously, when using `llama-server -dr` to download models from Docker Hub, there was no progress feedback and an interrupted download had to start over from scratch.

Solution
1. Docker-Style Progress Bars
Added a real-time, docker-style progress display. Progress updates every 100 ms for smooth visual feedback, and TTY detection ensures the bars only appear on interactive terminals; an illustrative sketch follows below.
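To make the behavior concrete, here is a minimal, hedged sketch of stdlib-only TTY detection and a progress printer throttled to one redraw per 100 ms. The names (`stdoutIsTTY`, `progressPrinter`) are illustrative and not taken from the PR:

```go
// Minimal sketch (not the PR's actual code): stdlib-only TTY detection and a
// progress printer that redraws at most once per interval.
package main

import (
	"fmt"
	"os"
	"time"
)

// stdoutIsTTY reports whether stdout is an interactive terminal, so progress
// bars can be suppressed when output is redirected to a file or pipe.
func stdoutIsTTY() bool {
	fi, err := os.Stdout.Stat()
	if err != nil {
		return false
	}
	return fi.Mode()&os.ModeCharDevice != 0
}

// progressPrinter rewrites a single docker-style line, at most every interval.
type progressPrinter struct {
	total    int64
	interval time.Duration
	last     time.Time
	enabled  bool
}

func (p *progressPrinter) update(done int64) {
	if !p.enabled || time.Since(p.last) < p.interval {
		return
	}
	p.last = time.Now()
	pct := float64(done) / float64(p.total) * 100
	// "\r" rewrites the same terminal line instead of scrolling.
	fmt.Printf("\rDownloading  %6.2f%%  (%d/%d bytes)", pct, done, p.total)
}

func main() {
	p := &progressPrinter{total: 1 << 20, interval: 100 * time.Millisecond, enabled: stdoutIsTTY()}
	for done := int64(0); done <= p.total; done += 64 << 10 {
		p.update(done)
		time.Sleep(20 * time.Millisecond) // simulate arriving download chunks
	}
	// A real implementation would force a final 100% update before the newline.
	fmt.Println()
}
```

Throttling inside the printer rather than in the download loop keeps the copy path simple and caps terminal output regardless of chunk size.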
2. Resumable Downloads
Implemented automatic resume capability: partial downloads are written with a `.tmp` extension, a `.digest` sidecar file is kept for integrity verification, and `io.CopyN` is used for the data copy itself; see the sketch below.
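As a rough sketch of a resume flow under these assumptions (a `.tmp` partial file plus an HTTP `Range` request; the function name, URL, and error handling are illustrative, not the actual `oci-go/oci.go` code):

```go
// Hedged sketch: resume a blob download into "<path>.tmp" using an HTTP Range
// request, then rename the finished file into place.
package main

import (
	"fmt"
	"io"
	"net/http"
	"os"
)

// downloadResumable fetches url into path, continuing from any existing
// partial "<path>.tmp" left by an interrupted run. totalSize is the expected
// blob size (e.g. taken from the OCI manifest).
func downloadResumable(url, path string, totalSize int64) error {
	tmp := path + ".tmp"

	// Figure out how many bytes we already have.
	var offset int64
	if fi, err := os.Stat(tmp); err == nil {
		offset = fi.Size()
	}
	if offset > totalSize {
		offset = 0 // oversized partial file: start over
	}

	req, err := http.NewRequest(http.MethodGet, url, nil)
	if err != nil {
		return err
	}
	if offset > 0 {
		req.Header.Set("Range", fmt.Sprintf("bytes=%d-", offset))
	}
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK && resp.StatusCode != http.StatusPartialContent {
		return fmt.Errorf("unexpected status: %s", resp.Status)
	}
	// If the server ignored the Range header, start from scratch.
	if offset > 0 && resp.StatusCode != http.StatusPartialContent {
		offset = 0
	}

	flags := os.O_CREATE | os.O_WRONLY | os.O_APPEND
	if offset == 0 {
		flags = os.O_CREATE | os.O_WRONLY | os.O_TRUNC
	}
	f, err := os.OpenFile(tmp, flags, 0o644)
	if err != nil {
		return err
	}
	defer f.Close()

	// io.CopyN stops after exactly the remaining number of bytes.
	if _, err := io.CopyN(f, resp.Body, totalSize-offset); err != nil {
		return err
	}
	return os.Rename(tmp, path)
}

func main() {
	// Hypothetical usage; the URL, file name, and size are placeholders.
	if err := downloadResumable("https://example.com/blobs/sha256-abc", "model.gguf", 1<<20); err != nil {
		fmt.Fprintln(os.Stderr, "download failed:", err)
	}
}
```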
3. Enhanced Cache Validation
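As a hedged illustration of what digest-based cache validation could look like, reusing the hypothetical `<path>.digest` sidecar from the sketch above (the actual checks in this PR may differ):

```go
// Hedged illustration of digest-based cache validation: a cached blob is only
// reused if its SHA-256 matches the recorded "<path>.digest" sidecar.
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"io"
	"os"
	"strings"
)

// cachedBlobValid reports whether the blob at path exists and its SHA-256
// matches the digest recorded in "<path>.digest" (format "sha256:<hex>").
func cachedBlobValid(path string) (bool, error) {
	want, err := os.ReadFile(path + ".digest")
	if err != nil {
		return false, nil // no sidecar: treat as cache miss, not an error
	}
	f, err := os.Open(path)
	if err != nil {
		return false, nil // blob missing: cache miss
	}
	defer f.Close()

	h := sha256.New()
	if _, err := io.Copy(h, f); err != nil {
		return false, err
	}
	got := "sha256:" + hex.EncodeToString(h.Sum(nil))
	return got == strings.TrimSpace(string(want)), nil
}

func main() {
	ok, err := cachedBlobValid("model.gguf") // hypothetical cached file name
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		return
	}
	fmt.Println("cache valid:", ok)
}
```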
Technical Details
Implementation
- `progressWriter` struct in Go that wraps `io.Writer`
- `sync/atomic` for thread-safe progress tracking (see the sketch below)
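A minimal sketch of such a `progressWriter`, assuming the `sync/atomic` counter is read from a separate reporting goroutine; field and method names are illustrative, not the PR's:

```go
// Sketch of a progressWriter in the spirit described above: it wraps an
// io.Writer and counts bytes with sync/atomic so another goroutine can read
// the running total safely.
package main

import (
	"bytes"
	"fmt"
	"io"
	"strings"
	"sync/atomic"
)

// progressWriter forwards writes to w and atomically accumulates the number
// of bytes written so far.
type progressWriter struct {
	w       io.Writer
	written atomic.Int64
}

func (p *progressWriter) Write(b []byte) (int, error) {
	n, err := p.w.Write(b)
	p.written.Add(int64(n))
	return n, err
}

// Written can be called from a separate progress-reporting goroutine.
func (p *progressWriter) Written() int64 {
	return p.written.Load()
}

func main() {
	var dst bytes.Buffer
	pw := &progressWriter{w: &dst}

	// Copy a simulated download through the progress writer, as one might do
	// with an HTTP response body.
	src := strings.NewReader(strings.Repeat("x", 4096))
	if _, err := io.Copy(pw, src); err != nil {
		fmt.Println("copy failed:", err)
		return
	}
	fmt.Printf("wrote %d bytes\n", pw.Written())
}
```

Keeping the counter in the writer means the download code needs no extra bookkeeping; the reporting goroutine just polls `Written()` on its own schedule.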
Files Modified
- `oci-go/oci.go` (+180 lines): Core implementation
- `docs/oci-registry.md` (+43 lines): Documentation updates

Code Quality
Benefits
Backward Compatibility
The changes are fully backward compatible.
Usage
No changes are required: the existing `-dr` flag now includes progress display and resume support.

Warning
Firewall rules blocked me from connecting to one or more addresses. I tried to connect to the following addresses, but was blocked by firewall rules:

- `ggml.ai` (DNS block), encountered while running `/home/REDACTED/work/llama.cpp/llama.cpp/build/bin/test-arg-parser`